The aim of this paper is to improve beat-tracking for live guitar performances. Beat-tracking estimates musical quantities such as tempo and beat phase, and is critical for synchronized ensemble performance, for example musical robot accompaniment. Beat-tracking of a live guitar performance must cope with three challenges: tempo fluctuation, beat pattern complexity, and environmental noise. To address these problems, we devise an audiovisual integration method for beat-tracking. The auditory beat features, tactus (phase) and tempo (period), are estimated by Spectro-Temporal Pattern Matching (STPM), which is robust against stationary noise. The visual beat features are estimated by tracking the position of the hand relative to the guitar using optical flow, mean shift, and the Hough transform. The two feature streams are integrated by a particle filter that aggregates the multimodal information based on a beat location model and a hand trajectory model. Experimental results confirm that our beat-tracking improves the F-measure by 8.9 points on average over the Murata beat-tracking method, which uses STPM and rule-based beat detection. The results also show that the system runs in real time with a reduced number of particles while preserving estimation accuracy. We demonstrate an ensemble in which the humanoid HRP-2 plays the theremin with a human guitarist.
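To make the fusion step concrete, the following is a minimal sketch of a particle filter that aggregates an audio beat-phase observation and a visual beat-phase observation over a (phase, period) state, in the spirit of the integration described above. The state layout, noise parameters, Gaussian likelihood models, and all function names (`predict`, `update`, `resample`) are illustrative assumptions; they do not reproduce the paper's beat location and hand-trajectory models.

```python
import numpy as np

# Sketch of audiovisual beat-tracking with a particle filter.
# Each particle carries a beat phase in [0, 1) and a beat period in
# seconds; the two observation likelihoods stand in for the paper's
# STPM audio features and hand-trajectory visual features.
# All parameters below are illustrative assumptions.

rng = np.random.default_rng(0)
N = 200                                  # number of particles
particles = np.column_stack([
    rng.uniform(0.0, 1.0, N),            # beat phase in [0, 1)
    rng.uniform(0.4, 0.8, N),            # period in seconds (75-150 BPM)
])
weights = np.full(N, 1.0 / N)

def predict(particles, dt):
    """Advance each phase by dt/period and add small diffusion noise."""
    phase, period = particles[:, 0], particles[:, 1]
    phase = (phase + dt / period + rng.normal(0, 0.01, len(phase))) % 1.0
    period = np.clip(period + rng.normal(0, 0.005, len(period)), 0.2, 2.0)
    return np.column_stack([phase, period])

def update(particles, weights, audio_phase, visual_phase):
    """Reweight particles by how well they explain both modalities.
    audio_phase / visual_phase are assumed beat-phase estimates in [0, 1)."""
    def circ_dist(a, b):                 # distance on the unit phase circle
        d = np.abs(a - b)
        return np.minimum(d, 1.0 - d)
    la = np.exp(-0.5 * (circ_dist(particles[:, 0], audio_phase) / 0.05) ** 2)
    lv = np.exp(-0.5 * (circ_dist(particles[:, 0], visual_phase) / 0.10) ** 2)
    w = weights * la * lv                # fusion: product of likelihoods
    return w / w.sum()

def resample(particles, weights):
    """Resample particles in proportion to weight to avoid degeneracy."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# One filtering step per analysis frame (here dt = 10 ms):
particles = predict(particles, dt=0.01)
weights = update(particles, weights, audio_phase=0.12, visual_phase=0.15)
particles, weights = resample(particles, weights)
tempo_bpm = 60.0 / np.average(particles[:, 1], weights=weights)
```

Multiplying the two likelihoods treats the modalities as conditionally independent given the beat state; keeping N small, as the abstract notes, is what makes real-time operation feasible.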